Optimizing Echo State Networks for Static Pattern Recognition
Authors
Abstract
Similar Resources
Echo State Networks for Arabic Phoneme Recognition
This paper presents an ESN-based Arabic phoneme recognition system trained with supervised, forced, and combined supervised/forced learning algorithms. Mel-Frequency Cepstral Coefficients (MFCCs) and Linear Predictive Coding (LPC) are used and compared as input feature extraction techniques. The system is evaluated using 6 speakers from the King Abdulaziz Arabic Phonetics ...
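The exact feature pipeline of the cited paper is not given here; as a rough, illustrative sketch of standard MFCC extraction (window length, hop, filterbank size, and cepstral count are all assumed defaults, not the paper's settings):

```python
import numpy as np
from scipy.fftpack import dct  # DCT-II for the final cepstral step

def mfcc(signal, sr=16000, n_fft=512, n_mels=26, n_ceps=13):
    """Sketch of MFCC extraction: frame -> window -> power spectrum
    -> mel filterbank -> log -> DCT. Parameters are illustrative."""
    # Frame into 25 ms windows with a 10 ms hop, apply a Hamming window.
    win, hop = int(0.025 * sr), int(0.010 * sr)
    frames = np.array([signal[i:i + win]
                       for i in range(0, len(signal) - win, hop)])
    frames = frames * np.hamming(win)
    power = np.abs(np.fft.rfft(frames, n_fft)) ** 2 / n_fft

    # Triangular mel filterbank between 0 Hz and Nyquist.
    mel = lambda f: 2595 * np.log10(1 + f / 700)
    imel = lambda m: 700 * (10 ** (m / 2595) - 1)
    pts = imel(np.linspace(mel(0), mel(sr / 2), n_mels + 2))
    bins = np.floor((n_fft + 1) * pts / sr).astype(int)
    fb = np.zeros((n_mels, n_fft // 2 + 1))
    for j in range(1, n_mels + 1):
        l, c, r = bins[j - 1], bins[j], bins[j + 1]
        fb[j - 1, l:c] = (np.arange(l, c) - l) / max(c - l, 1)
        fb[j - 1, c:r] = (r - np.arange(c, r)) / max(r - c, 1)

    # Log mel energies, then DCT to decorrelate; keep the first n_ceps.
    logmel = np.log(power @ fb.T + 1e-10)
    return dct(logmel, type=2, axis=1, norm='ortho')[:, :n_ceps]

# Usage: 1 s of a 440 Hz tone at 16 kHz -> one 13-dim vector per frame.
sig = np.sin(2 * np.pi * 440 * np.arange(16000) / 16000)
feats = mfcc(sig)
```

Each row of `feats` is a per-frame feature vector of the kind typically fed to an ESN or other sequence classifier.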
Maximum Echo-State-Likelihood Networks for Emotion Recognition
Maximum Echo-State-Likelihood Networks for Emotion Recognition (Edmondo Trentin, Stefan Scherer, and Friedhelm Schwenker); Evaluation of Feature Selection by Multiclass Kernel Discriminant Analysis (Tsuneyoshi Ishii and Shigeo Abe); Correlation-Based and Causal Feature Selection Analysis for Ensemble Classifiers (Rakkrit Duangsoithong and Terry Windeatt); A New Monte Carlo-based Error Rate Estimator (Ah...
Evolutionary Optimization of Echo State Networks: Multiple Motor Pattern Learning
Echo State Networks are a special class of recurrent neural networks that are well suited for attractor-based learning of motor patterns. Using structural multi-objective optimization, the trade-off between network size and accuracy can be identified. This allows choosing a feasible model capacity for a follow-up full-weight optimization. Both optimization steps can be combined into a nested,...
Pattern Recognition and Learning Based Self Optimizing Networks
Self Optimizing Networks (SON) are becoming increasingly relevant as demand for wireless capacity increases and mobile networks get more complex with a large number of parameters and attributes needing to be optimized. These SON processes work in an autonomous, closed-loop mode – continuously monitoring the network through measurements made by the infrastructure and user equipment (UE), identif...
Restricted Echo State Networks
Echo state networks are a powerful type of reservoir neural network, but the reservoir is essentially unrestricted in its original formulation. Motivated by limitations in neuromorphic hardware, we remove combinations of the four sources of memory—leaking, loops, cycles, and discrete time—to determine how these influence the suitability of the reservoir. We show that loops and cycles can replic...
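As background on the leaky-integrator update that such reservoirs implement, here is a minimal ESN sketch in NumPy; the reservoir size, leaking rate, spectral radius, toy task, and ridge penalty are all illustrative assumptions, not settings from any of the papers above:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100
leak = 0.3  # leaking rate: one of the memory sources discussed above

# Fixed random input and recurrent weights; only the readout is trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
# Scale spectral radius below 1, a common heuristic for the echo state property.
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u_seq):
    """Collect reservoir states under the leaky-integrator update
    x <- (1 - a) x + a tanh(W x + W_in u)."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = (1 - leak) * x + leak * np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

# Toy task: predict a phase-shifted sine from the raw sine.
t = np.linspace(0, 20, 500)
u, y = np.sin(t), np.sin(t + 0.2)
X = run_reservoir(u)[100:]  # drop a 100-step washout
Y = y[100:]

# Ridge-regression readout, the only trained component of an ESN.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)
mse = np.mean((X @ W_out - Y) ** 2)
```

Removing or altering the memory sources named above (e.g. setting `leak = 1`, or zeroing recurrent loops in `W`) changes how much past input the state `x` retains, which is exactly the axis the restricted-reservoir study varies.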
Journal
Journal title: Cognitive Computation
Year: 2017
ISSN: 1866-9956,1866-9964
DOI: 10.1007/s12559-017-9468-2